10 research outputs found

    A Study on Learning Representations for Relations Between Words

    Reasoning about relations between words or entities plays an important role in human cognition. It is thus essential for a computational system that processes human language to understand the semantics of relations in order to simulate human intelligence. Automatic relation learning provides valuable information for many natural language processing tasks, including ontology creation, question answering and machine translation, to name a few. This need brings us to the topic of this thesis, whose main goal is to explore multiple resources and methodologies to effectively represent relations between words. How to effectively represent semantic relations between words remains an underexplored problem. One line of research makes use of relational patterns, the linguistic contexts in which two words co-occur in a corpus, to infer a relation between them (e.g., X leads to Y). This approach suffers from data sparseness because not every related word-pair co-occurs, even in a large corpus. In contrast, prior work on learning word embeddings has found that certain relations between words can be captured by applying linear arithmetic operators to the corresponding pre-trained word embeddings. Specifically, it has been shown that the vector offset (expressed as PairDiff) from one word to the other in a pair encodes the relation that holds between them, if any. Such a compositional method addresses data sparseness by inferring a relation from the constituent words of a word-pair and obviates the need for relational patterns. This thesis investigates the best way to compose word embeddings to represent relational instances. A systematic comparison of unsupervised operators is carried out, which in general reveals the superiority of the PairDiff operator across multiple word embedding models and benchmark datasets. Despite this empirical success, no theoretical analysis had so far been conducted to explain why and under what conditions PairDiff is optimal. To this end, a theoretical analysis is conducted for the generalised bilinear operators that can be used to measure the relational distance between two word-pairs. The main conclusion is that, under certain assumptions, the bilinear operator can be simplified to a linear form, of which the widely used PairDiff operator is a special case. Multiple recent works have raised concerns about existing unsupervised operators for inferring relations from pre-trained word embeddings. The thesis therefore addresses the question of whether it is possible to learn better parametrised relational compositional operators. A supervised relation representation operator is proposed, using a non-linear neural network that performs relation prediction. Evaluation on two benchmark datasets reveals that the penultimate layer of the trained neural relation predictor acts as a good representation of the relations between words. Because relational patterns and word embeddings provide complementary information for learning relations, a self-supervised context-guided relation embedding method that is trained on the two sources of information is proposed. Experimentally, incorporating relational contexts improves the performance of a compositional operator for representing unseen word-pairs. Besides unstructured text corpora, knowledge graphs provide another source of relational facts, in the form of nodes (i.e., entities) connected by edges (i.e., relations).
    Knowledge graphs are employed widely in natural language processing applications such as question answering and dialogue systems. Embedding the entities and relations of a graph has shown impressive results for inferring previously unseen relations between entities. This thesis contributes a theoretical model that relates the connections in the graph to the embeddings of entities and relations. Learning graph embeddings that satisfy the proven theorem performs well compared to existing heuristically derived graph embedding methods. As graph embedding methods generate representations only for existing relation types, a relation composition task is proposed in the thesis to tackle this limitation.
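    As an illustrative sketch of the PairDiff operator discussed above (not code from the thesis), the snippet below represents the relation between two words as the offset between their embeddings and compares two word-pairs by the cosine similarity of their offsets; the vectors are random toy stand-ins for pre-trained embeddings, so the printed value carries no linguistic meaning.

        import numpy as np

        def pair_diff(a, b):
            # PairDiff: the relation holding between words a and b is represented
            # by the vector offset from a to b.
            return b - a

        def cosine(u, v):
            return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

        # Toy 4-dimensional stand-ins for pre-trained word embeddings.
        rng = np.random.default_rng(0)
        man, king, woman, queen = (rng.normal(size=4) for _ in range(4))

        # Relational similarity between the pairs (man, king) and (woman, queen).
        print(cosine(pair_diff(man, king), pair_diff(woman, queen)))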

    Discovering Representative Space for Relational Similarity Measurement


    Why does PairDiff work? -- A Mathematical Analysis of Bilinear Relational Compositional Operators for Analogy Detection

    Representing the semantic relations that exist between two given words (or entities) is an important first step in a wide range of NLP applications such as analogical reasoning, knowledge base completion and relational information retrieval. A simple, yet surprisingly accurate, method for representing the relation between two words is to compute the vector offset (PairDiff) between their corresponding word embeddings. Despite this empirical success, it remains unclear whether PairDiff is the best operator for obtaining a relational representation from word embeddings. We conduct a theoretical analysis of generalised bilinear operators that can be used to measure the ℓ2 relational distance between two word-pairs. We show that, if the word embeddings are standardised and uncorrelated, such an operator will be independent of bilinear terms and can be simplified to a linear form, of which PairDiff is a special case. For numerous word embedding types, we empirically verify the uncorrelatedness assumption, demonstrating the general applicability of our theoretical result. Moreover, we experimentally discover PairDiff from the bilinear relation composition operator on several benchmark analogy datasets.
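    To make the linear form concrete, the following is a paraphrase in notation of our own choosing (not necessarily the paper's): for word-pairs (a, b) and (c, d) with embeddings \mathbf{a}, \mathbf{b}, \mathbf{c}, \mathbf{d}, a linear compositional operator and the induced relational distance can be written as

        f(\mathbf{a}, \mathbf{b}) = \mathbf{P}\mathbf{a} + \mathbf{Q}\mathbf{b}, \qquad
        d\big((\mathbf{a}, \mathbf{b}), (\mathbf{c}, \mathbf{d})\big) = \lVert f(\mathbf{a}, \mathbf{b}) - f(\mathbf{c}, \mathbf{d}) \rVert_{2},

    and PairDiff is recovered as the special case \mathbf{P} = -\mathbf{I}, \mathbf{Q} = \mathbf{I}, i.e. f(\mathbf{a}, \mathbf{b}) = \mathbf{b} - \mathbf{a}.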

    Compositional approaches for representing relations between words: A comparative study

    Identifying the relations that exist between words (or entities) is important for various natural language processing tasks such as relational search, noun-modifier classification and analogy detection. A popular approach to representing the relation between a pair of words is to extract the patterns in which the words co-occur in a corpus and to assign each word-pair a vector of pattern frequencies. Despite the simplicity of this approach, it suffers from problems of data sparseness, information scalability and linguistic creativity, as the model is unable to handle word pairs previously unseen in the corpus. In contrast, a compositional approach for representing relations between words overcomes these issues by using the attributes of each individual word to indirectly compose a representation for the common relations that hold between the two words. This study aims to compare different operations for creating relation representations from word-level representations. We investigate the performance of the compositional methods by measuring relational similarities using several benchmark datasets for word analogy. Moreover, we evaluate the different relation representations in a knowledge base completion task.
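    The sketch below (assuming a handful of operators commonly considered in this line of work; the exact set evaluated in the study may differ) shows how the relational similarity between two word-pairs changes with the choice of compositional operator; the vectors are random stand-ins for pre-trained word embeddings.

        import numpy as np

        # Unsupervised compositional operators for building a relation
        # representation from the embeddings of the two words in a pair.
        OPERATORS = {
            "PairDiff": lambda a, b: b - a,
            "Concat":   lambda a, b: np.concatenate([a, b]),
            "Add":      lambda a, b: a + b,
            "Mult":     lambda a, b: a * b,   # elementwise product
        }

        def cosine(u, v):
            return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

        def relational_similarity(pair1, pair2, compose):
            # Similarity between two word-pairs under a given operator.
            return cosine(compose(*pair1), compose(*pair2))

        rng = np.random.default_rng(1)
        a, b, c, d = (rng.normal(size=8) for _ in range(4))
        for name, op in OPERATORS.items():
            print(f"{name:8s} {relational_similarity((a, b), (c, d), op):+.3f}")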

    Learning to Compose Relational Embeddings in Knowledge Graphs

    Knowledge Graph Embedding methods learn low-dimensional representations for entities and relations in knowledge graphs (KGs), which can be used to infer previously unknown relations between pairs of entities. This is particularly useful for expanding otherwise sparse knowledge graphs. However, the relation types that can be predicted using knowledge graph embeddings are confined to the set of relations that already exist in the KG. Often the relations that exist between two entities are not independent, and it is possible to predict what other relations are likely to hold between two entities by composing the embeddings of the relations in which each entity participates. We introduce relation composition as the task of inferring embeddings for unseen relations by combining existing relations in a knowledge graph. Specifically, we propose a supervised method to compose relational embeddings for novel relations using pre-trained relation embeddings for existing relations. Our experimental results on a previously proposed benchmark dataset for relation composition ranking and triple classification show that the proposed supervised relation composition method outperforms several unsupervised relation composition methods.
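    A minimal sketch of supervised relation composition in the spirit described above (the network architecture and training signal here are illustrative assumptions, not the method from the paper): a small feed-forward network is trained to map the embeddings of two existing relations to the embedding of a target relation, using synthetic vectors in place of real pre-trained relation embeddings.

        import torch
        import torch.nn as nn

        class RelationComposer(nn.Module):
            # Maps the concatenation of two relation embeddings to a composed
            # embedding intended to approximate a target (novel) relation.
            def __init__(self, dim, hidden=64):
                super().__init__()
                self.net = nn.Sequential(
                    nn.Linear(2 * dim, hidden),
                    nn.ReLU(),
                    nn.Linear(hidden, dim),
                )

            def forward(self, r1, r2):
                return self.net(torch.cat([r1, r2], dim=-1))

        dim, n = 50, 256
        # Synthetic stand-ins for pre-trained relation embeddings and their targets.
        r1, r2 = torch.randn(n, dim), torch.randn(n, dim)
        target = r1 + r2 + 0.1 * torch.randn(n, dim)

        model = RelationComposer(dim)
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        for _ in range(200):
            loss = nn.functional.mse_loss(model(r1, r2), target)
            opt.zero_grad()
            loss.backward()
            opt.step()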

    RelWalk - A Latent Variable Model Approach to Knowledge Graph Embedding.

    Embedding entities and relations of a knowledge graph in a low-dimensional space has shown impressive performance in predicting missing links between entities. Although progress has been made, existing methods are heuristically motivated, and theoretical understanding of such embeddings is comparatively underdeveloped. This paper extends the random walk model (Arora et al., 2016a) of word embeddings to Knowledge Graph Embeddings (KGEs) to derive a scoring function that evaluates the strength of a relation R between two entities h (head) and t (tail). Moreover, we show that marginal loss minimisation, a popular objective used in much prior work on KGE, follows naturally from log-likelihood ratio maximisation under the probabilities estimated from the KGEs according to our theoretical relationship. We propose a learning objective motivated by the theoretical analysis to learn KGEs from a given knowledge graph. Using the derived objective, accurate KGEs are learnt from the FB15K237 and WN18RR benchmark datasets, providing empirical evidence in support of the theory.
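    The scoring function derived in the paper is not reproduced here; instead, the sketch below illustrates the general shape of the marginal (margin ranking) loss mentioned in the abstract, using a TransE-style score purely as a stand-in and random vectors in place of learnt embeddings.

        import numpy as np

        def score(h, r, t):
            # Stand-in TransE-style score (NOT the scoring function derived in
            # the paper): higher means relation r links head h to tail t better.
            return -np.linalg.norm(h + r - t)

        def margin_loss(pos, neg, margin=1.0):
            # Marginal loss: a positive triple should outscore a corrupted one
            # by at least the margin.
            return max(0.0, margin - pos + neg)

        rng = np.random.default_rng(2)
        h, r, t = (rng.normal(size=16) for _ in range(3))
        t_corrupt = rng.normal(size=16)   # corrupted (negative) tail entity

        print(margin_loss(score(h, r, t), score(h, r, t_corrupt)))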

    Burnout and quality of life among healthcare professionals during the COVID-19 pandemic in Saudi Arabia

    Background and Objectives. Healthcare professionals (HCPs) have had to deal with large numbers of confirmed or suspected cases of COVID-19 and were at high risk of burnout and dissatisfaction regarding their work-life integration. This article aims to assess burnout, work-life balance (WLB) and quality of life (QoL) among healthcare workers in Saudi Arabia, and the relationships between these aspects. Methods. An analytical cross-sectional study was conducted among 491 HCPs from five secondary hospitals in Jazan, Saudi Arabia. Three standardized questionnaires were used to gather data, covering WLB, burnout and the WHO Quality of Life-BREF. Results. Healthcare professionals struggled to balance their work and personal lives during COVID-19 and reported many burnout symptoms and a low level of QoL. Two-thirds (68.8%) of HCPs arrived home late from work, 56.6% skipped a meal, and 57.8% worked through a shift without any breaks. In addition, 39.3% of HCPs felt frustrated by technology and 60.5% were exhausted by their work. The correlation coefficients between WLB and health-related QoL (HRQoL) were significantly negative for all items, ranging from -0.099 to -0.403 (P < 0.05). The WLB and burnout scores were significant predictors of low levels of HRQoL (P < 0.001 for both explanatory variables). Conclusions. Work-life imbalance, high levels of burnout and low QoL levels were common among healthcare professionals in Saudi Arabia during COVID-19. Hospital administrations should address WLB and reduce burnout symptoms among HCPs to increase satisfaction and improve the quality of care.

    Table1_Factors influencing the mental health of caregivers of children with cerebral palsy.docx

    Objectives. Caregivers of children with cerebral palsy carry a huge burden, which might affect their mental health. This study aimed to determine the different factors affecting the mental health of caregivers of children with cerebral palsy and to raise awareness among healthcare providers. Methods. A cross-sectional study was conducted among caregivers of children with cerebral palsy at National Guard Health Affairs-Jeddah, Saudi Arabia, using the Depression Anxiety Stress Scale-21, a validated questionnaire that assesses depression, anxiety and stress. This questionnaire was used to assess the mental health of the caregivers. In addition, factors reflecting the child's health condition, such as visual impairment, the number of emergency department visits and the number of Pediatric Intensive Care Unit admissions, were also recorded to investigate their impact on the caregivers' mental health. Results. The study sample consisted of 40 caregivers, of whom 72.5% were mothers. According to the Depression Anxiety Stress Scale-21 scores, 12.5% (n = 5) of the caregivers had moderate depression scores, 10% (n = 4) showed extremely severe depression, and 10% (n = 4) showed moderate anxiety. Furthermore, 12.5% (n = 5), 15% (n = 6) and 7.5% (n = 3) of the caregivers scored at moderate, severe and extremely severe stress levels, respectively. Caregivers' depression, anxiety and stress scores were significantly (p ≤ 0.05) associated with their children's visual impairment, frequent hospital admissions and frequent emergency department visits. Increased Pediatric Intensive Care Unit admissions in the past year were also significantly associated with higher caregiver anxiety scores. Conclusion. To the best of our knowledge, caregivers' stress and anxiety and their association with the children's level of dependency are not well documented in our region. Caregivers of children with cerebral palsy reported mental health challenges associated with the children's visual impairment, frequent need for acute medical care and hospital admissions. Healthcare workers should provide early and proactive planning of medical and social support for children with cerebral palsy and their families using a family-centered approach.